Reduce memory inefficiencies and leaks #95
Merged
…at disposes frees the handle to the underlying managed object

…ons to pass the instance and the method name

Both a FunctionTemplate and the concrete Function created from it live at the v8::Context level, and both are independent of the concrete object on which we call the method. Therefore we save a ton of allocations by caching those functions by assembly-qualified name and method name. Before, we were creating a FunctionTemplate and a Function for each method on each object instance, which led to hundreds of megabytes of allocated space when creating one million objects and calling a single method on them.

…n to a list of JavascriptFunction handles in JavascriptContext

- makes it safe to call Dispose (~JavascriptFunction) explicitly and to let the GC clean up undisposed instances (!JavascriptFunction)
- simplifies code in JavascriptContext, which no longer needs to know about JavascriptFunction handles that the GC takes care of anyway
- the Reset method on the Persistent&lt;Function&gt; handle is irrelevant after the v8::Context is disposed because the underlying memory has been freed by V8
oliverbock approved these changes on Apr 26, 2021
This all looks good to me, though it's been a long time since I have been in this code.
Co-authored-by: oliverbock <[email protected]>
Reduce memory inefficiencies and leaks
This PR is an effort to reduce various memory inefficiencies and leaks that become apparent mostly when you reuse `JavascriptContext` objects instead of disposing them after each use. But even if you dispose a context after each use, there is a chance you will benefit from these changes, especially if you have a longer-running operation that creates a lot of managed objects and calls methods on them. It also remedies the occasional crashes reported e.g. in #90 (see Commit 5).

The PR includes a couple of commits, each of which delivers an isolated improvement. I will go over each commit and provide test code, as well as before and after memory usage curves. Overall method used:
Test code
The following test code was used to discover and solve all inefficiencies. The overall structure was the same for all tests, only a couple of lines were added after each commit. Each test case is marked with a comment that will be referenced in the explainer for the corresponding commit. Also, each breakpoint location I used is marked with a comment.
Commit 1 - Set up a callback function that runs when a JS object becomes weak
In this scenario we create a lot of managed `Product` objects from JS code. Since each object is scoped to only one iteration of the `for` loop, the .NET garbage collector (GC) should be able to collect these objects while the loop is running, resulting in a flat memory curve. However, each object is held with a strong GC handle here:

Javascript.Net/Source/Noesis.Javascript/JavascriptExternal.cpp, line 53 in a5ebde3

The handle is only released when `JavascriptContext` cleans up all `WrappedExternal` objects in its destructor:

Javascript.Net/Source/Noesis.Javascript/JavascriptContext.cpp, lines 176 to 177 in a5ebde3

with the destructor of each `JavascriptExternal` freeing the GC handle.

Memory curve before (1,000,000 iterations; 36.28s runtime in the diagnostic session)
To solve this problem we keep track of a `v8::Persistent` for the object and set a callback function on it, using the `SetWeak` method, that runs when V8 deems the object weak (i.e. the object only has weak references left in the V8 context, if any at all). For this we use a new helper method `Wrap` on `JavascriptExternal` that sets the internal field of the `v8::Local` wrapper object, initializes the `v8::Persistent` for it, and sets the callback.

When the callback runs we a) remove the `WrappedExternal` from the tracked externals in `JavascriptContext` and b) free the underlying pointer, which leads to the execution of the `JavascriptExternal` destructor and therefore the freeing of the GC handle. The GC is now able to collect the managed object.

Memory curve after (1,000,000 iterations; 36s runtime in the diagnostic session)
Commit 2 - Cache WrappedMethod at context level
In this scenario we additionally call two distinct methods on each created object. One of those methods is called twice.
Each `JavascriptExternal` holds a cache of `WrappedMethod` objects, one for each method that was called on the underlying managed object, here:

Javascript.Net/Source/Noesis.Javascript/JavascriptExternal.cpp, line 55 in a5ebde3

Javascript.Net/Source/Noesis.Javascript/JavascriptExternal.cpp, lines 88 to 91 in a5ebde3

Memory curve before (200,000 iterations; 33.93s runtime in the diagnostic session)
We do not need to cache in this way, however, because both a `v8::FunctionTemplate` and the concrete `v8::Function` live at the `v8::Context` level and are moreover completely independent of the concrete object on which we call the method. Think of V8 calling a method like `functionInstance.call(objectInstance, <arguments>)`. Therefore we can save a ton of allocations by caching those functions by assembly-qualified name and method name at the `JavascriptContext` level instead, and reusing them for different object instances.

The only thing we must provide when creating the `v8::FunctionTemplate` is the method name, so that the `JavascriptInterop::Invoker` method can find the method by reflection here:

Javascript.Net/Source/Noesis.Javascript/JavascriptInterop.cpp, line 790 in a5ebde3

The target instance comes from the `iArgs.Holder()` object: its internal field is a reference to the concrete `JavascriptExternal` from which we can get the managed object.

This change had the overall biggest impact.

Memory curve after (200,000 iterations; 18.46s runtime in the diagnostic session)
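The context-level cache can be sketched as a plain C++ map keyed by (assembly-qualified type name, method name). `Function`, `FunctionCache`, and `getFunction` are illustrative names, not the PR's actual classes; the point is that the expensive creation happens once per (type, method), not once per instance:

```cpp
#include <cassert>
#include <map>
#include <string>
#include <utility>

// Hypothetical stand-in for a v8::Function: cheap to reuse, costly to create.
struct Function {
    std::string methodName;
};

// Context-level cache analog: one Function per (assembly-qualified type name,
// method name) pair, shared by every instance of that type.
class FunctionCache {
public:
    const Function& getFunction(const std::string& assemblyQualifiedName,
                                const std::string& methodName) {
        auto key = std::make_pair(assemblyQualifiedName, methodName);
        auto it = cache_.find(key);
        if (it == cache_.end()) {
            ++creations_;  // allocation happens only on the first call
            it = cache_.emplace(key, Function{methodName}).first;
        }
        return it->second;
    }
    int creations() const { return creations_; }

private:
    std::map<std::pair<std::string, std::string>, Function> cache_;
    int creations_ = 0;
};
```

Calling the same method on a million different instances now hits the cache a million times but allocates only once, which is where the bulk of the savings in the memory curves comes from.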
Commits 3 and 4 - Reduce wrapping in iterator callbacks and use method cache
In this scenario we use a `for-of` loop to iterate through a collection on each object. The collection is returned by yet another method we call.

These commits align the creation of iterator callbacks with the technique used in Commit 2. They don't measurably improve memory usage in the above test, but they simplify and unify the code structure.
Commit 5 - Prefer a weak reference to the JavascriptContext in JavascriptFunction
In this scenario we additionally call a method that takes a `JavascriptFunction` callback as a parameter, and we do not dispose the callback explicitly.

Instead of holding a list of `JavascriptFunction` handles in the `JavascriptContext` here:

Javascript.Net/Source/Noesis.Javascript/JavascriptContext.cpp, line 163 in a5ebde3

we now hold a weak reference to the `JavascriptContext` in `JavascriptFunction`. This improves memory usage only slightly, but more importantly it is now safe for the GC to collect undisposed `JavascriptFunction` objects after the `JavascriptContext` has been disposed. Before, this led to memory access violations, which meant that embedders had to call `Dispose` explicitly on every `JavascriptFunction`. That's not always easily possible and really not necessary. This commit is therefore also a better fix for #90.

It also simplifies code in `JavascriptContext`, which no longer needs to know about `JavascriptFunction` handles that the GC takes care of anyway. Instead we just check whether the `JavascriptContext` the function was created in is still "alive" and only call the `Reset` method on the `v8::Persistent` in that case (see `~JavascriptFunction`). If the V8 context has already been disposed, the call to `Reset` is irrelevant because the underlying memory has already been freed.

Memory curve before (200,000 iterations; 37.32s runtime in the diagnostic session)

Memory curve after (200,000 iterations; 33.54s runtime in the diagnostic session)
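The "weak reference to the context" pattern can be sketched in standard C++ with `std::weak_ptr`. All names here (`PersistentHandle`, `Context`, `Function`, the `resetCalls` counter) are hypothetical stand-ins; in the real code the liveness check guards a `v8::Persistent::Reset`, not a counter:

```cpp
#include <cassert>
#include <memory>
#include <utility>

int resetCalls = 0;  // observable side effect for this sketch

// Hypothetical stand-in for the v8::Persistent<v8::Function> handle.
struct PersistentHandle {
    void Reset() { ++resetCalls; }  // only safe while the V8 context is alive
};

// Analog of the object owning the V8 context memory.
struct Context {};

// Analog of JavascriptFunction after this commit: it holds only a weak
// reference to its context, so the context no longer tracks functions at all.
class Function {
public:
    explicit Function(std::weak_ptr<Context> context)
        : context_(std::move(context)) {}

    // Analog of ~JavascriptFunction: Reset only if the context is still alive;
    // if it was already disposed, the memory is gone and Reset is skipped.
    ~Function() {
        if (auto alive = context_.lock()) {
            handle_.Reset();
        }
    }

private:
    std::weak_ptr<Context> context_;
    PersistentHandle handle_;
};
```

Destroying a `Function` after its context is gone is now a no-op instead of a touch of freed memory, which is why undisposed callbacks no longer crash after context disposal.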
Conclusion
When not using any JS callbacks in managed code (i.e. `JavascriptFunction` instances), we now have a completely flat curve when creating thousands of managed objects from JS and calling methods on them. We still leak something when callbacks are involved, so memory use still increases in the scenario of Commit 5. But compared to what we had before, this is a great improvement. I haven't had the chance to analyze the remaining leak yet and would do that in another PR if I find time at a later date. For now I'm quite happy with the results.

Running the complete test code without the callback scenario on the master branch, we get the following curve (200,000 iterations; 46.17s runtime in the diagnostic session).

Running the complete test code without the callback scenario with this PR, we get the following curve (200,000 iterations; 26.17s runtime in the diagnostic session).